From Sums to Integrals: Foundations of Continuous Random Variables
MATH005 Lesson 5
The transition from discrete to continuous random variables represents a monumental shift in perspective: from summing individual 'mass points' to measuring the smooth 'area' under a density curve. While discrete variables deal with countable outcomes, continuous variables model the infinite granularity of the real world—time, distance, and weight.

The Core Shift: Sums to Integrals

A random variable $X$ is continuous if there is a nonnegative function $f$, called the probability density function (PDF) of $X$, such that for any set of real numbers $B$:

$P\{X \in B\} = \int_B f(x) dx$

Crucially, this implies that for any specific value $a$, $P(X = a) = \int_a^a f(x) dx = 0$. In the continuous realm, we only speak of probabilities over intervals.
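Both facts can be checked numerically. As a sketch (not part of the lesson itself), take an illustrative exponential density $f(x) = \lambda e^{-\lambda x}$ for $x \ge 0$ and integrate it with SciPy:

```python
import math
from scipy.integrate import quad

lam = 2.0  # assumed rate parameter, chosen only for illustration

def f(x):
    """Exponential PDF: f(x) = lam * exp(-lam * x) for x >= 0, else 0."""
    return lam * math.exp(-lam * x) if x >= 0 else 0.0

# P{X in B} for the interval B = [0.5, 1.5]: integrate the density over B.
p_interval, _ = quad(f, 0.5, 1.5)

# P{X = a}: the integral from a to a is always zero.
p_point, _ = quad(f, 1.0, 1.0)

print(p_interval)  # e^{-1} - e^{-3}, about 0.318
print(p_point)     # 0.0
```

The second result is the numerical face of $P(X = a) = 0$: a single point contributes no area under the curve.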

The PDF and CDF Symbiosis

The Cumulative Distribution Function (CDF) $F(x)$ acts as the accumulator of probability from negative infinity up to $x$:

The Relationship
$F(x) = P\{X \le x\} = \int_{-\infty}^{x} f(t) dt$
The Derivative
By the Fundamental Theorem of Calculus, the density is the rate at which probability accumulates:
$\frac{d}{dx}F(x) = f(x)$
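The derivative relationship can be verified with a finite difference. Below is a small check, again using an assumed exponential distribution purely for illustration: a central difference of the CDF at a point should match the PDF there.

```python
import math

lam = 1.5  # illustrative rate parameter

def F(x):
    """Exponential CDF: F(x) = 1 - exp(-lam * x) for x >= 0, else 0."""
    return 1.0 - math.exp(-lam * x) if x >= 0 else 0.0

def f(x):
    """Exponential PDF: the derivative of F."""
    return lam * math.exp(-lam * x) if x >= 0 else 0.0

# Central difference approximates dF/dx at x = 1.
x, h = 1.0, 1e-6
numeric_derivative = (F(x + h) - F(x - h)) / (2 * h)

print(numeric_derivative)  # should be close to f(1.0) = 1.5 * exp(-1.5)
print(f(x))
```

The agreement up to truncation error is exactly the Fundamental Theorem of Calculus at work: the density is the local rate of probability accumulation.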

Measures of Central Tendency

  • Expected Value: $E[X] = \int_{-\infty}^{\infty} xf(x) dx$
  • Median ($m$): The point that bisects the area, where $F(m) = \frac{1}{2}$.
  • Mode: The value of $x$ for which $f(x)$ attains its maximum.
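All three measures can be computed directly from their definitions. The sketch below uses an assumed Exp($\lambda$) density, for which the known answers are mean $1/\lambda$, median $\ln(2)/\lambda$, and mode $0$:

```python
import math
from scipy.integrate import quad
from scipy.optimize import brentq

lam = 2.0  # illustrative rate: mean 1/lam, median ln(2)/lam, mode 0

def f(x):
    """Exponential PDF."""
    return lam * math.exp(-lam * x) if x >= 0 else 0.0

def F(x):
    """Exponential CDF."""
    return 1.0 - math.exp(-lam * x) if x >= 0 else 0.0

# Expected value: E[X] = integral of x * f(x).
mean, _ = quad(lambda x: x * f(x), 0, math.inf)

# Median: solve F(m) = 1/2 by root-finding on a bracketing interval.
median = brentq(lambda m: F(m) - 0.5, 0.0, 10.0)

# Mode: maximize f on a grid (this density is decreasing, so the mode is 0).
grid = [i / 1000 for i in range(10001)]
mode = max(grid, key=f)

print(mean, median, mode)  # about 0.5, 0.3466, 0.0
```

Note that the three measures disagree for this skewed density, which is why all three are worth tracking.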

The Limits of Summation

To appreciate the "Integrals" in our journey, contrast the discrete world with the continuous one. Discretely, we might meet the Basel problem, famously solved by Euler ($\sum_{k=1}^{\infty} 1/k^2 = \pi^2/6$), or intricate divisor logic (for the greatest common divisor $D$ to equal $k$, $k$ must divide both $X$ and $Y$, and $X/k$ and $Y/k$ must be relatively prime). In the continuous world, we instead calculate variance as $Var(X) = E[(X - E[X])^2]$ and expectations of functions via $E[g(X)] = \int_{-\infty}^{\infty} g(x)f(x) dx$.
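Both continuous formulas reduce to one-dimensional integrals, so they are easy to check numerically. This sketch assumes an Exp(1) density, for which $E[X] = 1$ and $Var(X) = 1$, and also confirms the shortcut $Var(X) = E[X^2] - (E[X])^2$:

```python
import math
from scipy.integrate import quad

lam = 1.0  # illustrative rate: Exp(1) has E[X] = 1 and Var(X) = 1

def f(x):
    """Exponential PDF."""
    return lam * math.exp(-lam * x) if x >= 0 else 0.0

# E[X] via the expectation integral.
mean, _ = quad(lambda x: x * f(x), 0, math.inf)

# Var(X) = E[(X - E[X])^2], straight from the definition.
var, _ = quad(lambda x: (x - mean) ** 2 * f(x), 0, math.inf)

# E[g(X)] for g(x) = x^2: integrate g(x) * f(x).
e_x2, _ = quad(lambda x: x ** 2 * f(x), 0, math.inf)

print(var)               # about 1.0 for Exp(1)
print(e_x2 - mean ** 2)  # the shortcut formula agrees with the definition
```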

🎯 Key Insight
Expectation can also be read off the CDF as a signed area: for positive values it is the area between the CDF and the line $y = 1$, and for negative values we subtract the area between the CDF and the line $y = 0$. For any random variable $Y$:
$E[Y] = \int_{0}^{\infty} P\{Y > y\} dy - \int_{0}^{\infty} P\{Y < -y\} dy$
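For a nonnegative variable the second integral vanishes, so $E[Y]$ is just the integral of the survival function $P\{Y > y\}$. Here is a check under an assumed Exp($\lambda$) distribution, where $P\{Y > y\} = e^{-\lambda y}$ and the expectation is $1/\lambda$:

```python
import math
from scipy.integrate import quad

lam = 2.0  # illustrative rate; Exp(lam) is nonnegative, so the second integral is 0

def survival(y):
    """Survival function P{Y > y} = exp(-lam * y) for y >= 0."""
    return math.exp(-lam * y)

# Tail formula: E[Y] = integral of P{Y > y} over [0, infinity).
tail_expectation, _ = quad(survival, 0, math.inf)

print(tail_expectation)  # about 1/lam = 0.5, matching E[Y]
```

Geometrically, this integral is the area above the CDF and below the line $y = 1$, which is the "Key Insight" restated.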